Visual-Inertial Sensor Fusion: Localization, Mapping and Sensor-to-Sensor Self-calibration
Authors
Abstract
Visual and inertial sensors, in combination, are able to provide accurate motion estimates and are well-suited for use in many robot navigation tasks. However, correct data fusion, and hence overall performance, depends on careful calibration of the rigid body transform between the sensors. Obtaining this calibration information is typically difficult and time-consuming, and normally requires additional equipment. In this paper we describe an algorithm, based on the unscented Kalman filter, for self-calibration of the transform between a camera and an inertial measurement unit (IMU). Our formulation rests on a differential geometric analysis of the observability of the camera-IMU system; this analysis shows that the sensor-to-sensor transform, the IMU gyroscope and accelerometer biases, the local gravity vector, and the metric scene structure can be recovered from camera and IMU measurements alone. While calibrating the transform we simultaneously localize the IMU and build a map of the surroundings – all without additional hardware or prior knowledge about the environment in which a robot is operating. We present results from simulation studies and from experiments with a monocular camera and a low-cost IMU, which demonstrate accurate estimation of both the calibration parameters and the local scene structure.
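As a rough illustration of how the sensor-to-sensor transform enters the estimator described above, the sketch below implements a generic pinhole-camera measurement model: a world-frame landmark is mapped through the IMU pose and the IMU-to-camera extrinsics before being projected to pixel coordinates. This is not the authors' implementation; the frame conventions, variable names, and intrinsic values are illustrative assumptions.

```python
# Minimal sketch (not the paper's code): how the IMU-to-camera extrinsics
# (R_ic, p_ic) appear in the camera measurement model of a visual-inertial
# filter.  Frame conventions and intrinsics are illustrative assumptions.
import numpy as np

def project_landmark(R_wi, p_wi, R_ic, p_ic, l_w,
                     fu=500.0, fv=500.0, cu=320.0, cv=240.0):
    """Project a world-frame landmark l_w into the camera image.

    R_wi, p_wi : IMU orientation (IMU-to-world rotation) and position in the world
    R_ic, p_ic : camera orientation and position expressed in the IMU frame
                 (the extrinsic parameters being calibrated)
    l_w        : 3-vector, landmark position in the world frame
    fu, fv, cu, cv : assumed pinhole intrinsics (focal lengths, principal point)
    """
    l_i = R_wi.T @ (l_w - p_wi)   # landmark expressed in the IMU frame
    l_c = R_ic.T @ (l_i - p_ic)   # landmark expressed in the camera frame
    if l_c[2] <= 0.0:
        raise ValueError("landmark is behind the camera")
    u = fu * l_c[0] / l_c[2] + cu
    v = fv * l_c[1] / l_c[2] + cv
    return np.array([u, v])

# Toy usage: IMU at the world origin, camera 5 cm along the IMU z-axis,
# axes aligned, landmark roughly 2 m in front of the camera.
R_wi, p_wi = np.eye(3), np.zeros(3)
R_ic, p_ic = np.eye(3), np.array([0.0, 0.0, 0.05])
l_w = np.array([0.2, -0.1, 2.0])
print(project_landmark(R_wi, p_wi, R_ic, p_ic, l_w))   # approx. [371.3, 214.4]
```

In an unscented Kalman filter, a nonlinear measurement function of this form (together with an IMU propagation model) is what the sigma points are propagated through; because the predicted pixel coordinates depend on R_ic and p_ic, image measurements carry information about the extrinsic parameters, provided the platform's motion is sufficiently exciting.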
Similar references
A New Approach to Self-Localization for Mobile Robots Using Sensor Data Fusion
This paper proposes a new approach for calibration of the dead-reckoning process. Using the well-known UMBmark (University of Michigan Benchmark) is not sufficient for a desirable calibration of dead reckoning. Besides, existing calibration methods usually require explicit measurement of the robot's actual motion. Some recent methods use a smart encoder trailer or long-range finder sensors such ...
A Hierarchical SLAM/GPS/INS Sensor Fusion with WLFP for Flying Robo-SAR's Navigation
In this paper, we present the results of a hierarchical SLAM/GPS/INS/WLFP sensor fusion to be used in navigation system devices. Due to the low quality of the inertial sensors, even a short-term GPS failure can lower the integrated navigation performance significantly. In addition, in GPS-denied environments, most navigation systems need a separate assisting resource in order to increase the avail...
Multi-Focus Image Fusion in DCT Domain using Variance and Energy of Laplacian and Correlation Coefficient for Visual Sensor Networks
The purpose of multi-focus image fusion is to gather the essential information and the focused parts of the input multi-focus images into a single image. These multi-focus images are captured with different depths of focus of the cameras. Many multi-focus image fusion techniques have been introduced that consider the focus measurement in the spatial domain. However, the multi-focus image ... (a generic block-wise sketch of this idea appears after this list of related papers)
PIRVS: An Advanced Visual-Inertial SLAM System with Flexible Sensor Fusion and Hardware Co-Design
In this paper, we present the PerceptIn Robotics Vision System (PIRVS), visual-inertial computing hardware with an embedded simultaneous localization and mapping (SLAM) algorithm. The PIRVS hardware is equipped with a multi-core processor, a global-shutter stereo camera, and an IMU with precise hardware synchronization. The PIRVS software features a novel and flexible sensor fusion approa...
A multi-hop PSO based localization algorithm for wireless sensor networks
A sensor network consists of a large number of sensor nodes that are distributed over a large geographic area to collect data. Localization is one of the key issues in wireless sensor network research, because it is important to determine the location of an event. On the other hand, finding the location of a wireless sensor node with the Global Positioning System (GPS) is not appropriate du...
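For the DCT-domain multi-focus fusion entry above, the sketch below shows one generic block-wise scheme: for each 8x8 tile, a focus measure is computed from the DCT coefficients (here, the variance of the AC coefficients) and the tile from the sharper source image is kept. This is only an assumption-laden illustration of the general idea, not the cited paper's method; it omits the energy-of-Laplacian and correlation-coefficient criteria named in that title.

```python
# Generic sketch of block-wise multi-focus fusion with a DCT-domain focus
# measure (variance of the AC coefficients).  Illustration of the general
# approach only, not the cited paper's algorithm.
import numpy as np
from scipy.fft import dctn

def ac_variance(block):
    """Focus measure: variance of the AC DCT coefficients of a tile."""
    coeffs = dctn(block, norm="ortho")
    ac = coeffs.ravel()[1:]          # drop the DC term
    return float(np.var(ac))

def fuse_blockwise(img_a, img_b, block=8):
    """Fuse two grayscale images by keeping, per tile, the sharper source."""
    h, w = img_a.shape
    fused = np.empty_like(img_a)
    for i in range(0, h, block):
        for j in range(0, w, block):
            a = img_a[i:i+block, j:j+block]
            b = img_b[i:i+block, j:j+block]
            fused[i:i+block, j:j+block] = a if ac_variance(a) >= ac_variance(b) else b
    return fused

# Toy usage: two random arrays standing in for differently focused sources.
rng = np.random.default_rng(0)
img_a = rng.random((64, 64))
img_b = rng.random((64, 64))
print(fuse_blockwise(img_a, img_b).shape)   # (64, 64)
```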
Journal: I. J. Robotics Res.
Volume: 30, Issue: -
Pages: -
Year of publication: 2011